# English fine-tuned base model
## Qwen2 96M

License: Apache-2.0 | Author: Felladrin | Tags: Large Language Model, English

Qwen2-96M is a miniature language model based on the Qwen2 architecture, containing 96 million parameters and supporting a context length of 8192 tokens, suitable for English text generation tasks.
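As a quick illustration, the sketch below loads the model with Hugging Face Transformers and generates a short English completion. The repository id `Felladrin/Qwen2-96M` and the generation settings are assumptions for this example, not confirmed by this page.

```python
# Minimal sketch: generating English text with a small Qwen2-style model.
# The repo id "Felladrin/Qwen2-96M" and the sampling settings are assumptions.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Felladrin/Qwen2-96M"  # assumed Hugging Face repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")

# Keep the prompt plus generated tokens well inside the 8192-token context window.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At 96 million parameters, the model is small enough to run on CPU, so no device placement or quantization is shown here.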